Current Issue: October - December 2016, Issue 4, 6 Articles
Background: Recent studies have shown that brain-machine interfaces (BMIs) offer great potential for restoring upper limb function. However, grasping objects is a complicated task and the signals extracted from the brain may not always be capable of driving these movements reliably. Vision-guided robotic assistance is one possible way to improve BMI performance. We describe a method of shared control where the user controls a prosthetic arm using a BMI and receives assistance with positioning the hand when it approaches an object.

Methods: Two human subjects with tetraplegia used a robotic arm to complete object transport tasks with and without shared control. The shared control system was designed to provide a balance between BMI-derived intention and computer assistance. An autonomous robotic grasping system identified and tracked objects and defined stable grasp positions for these objects. The system identified when the user intended to interact with an object based on the BMI-controlled movements of the robotic arm. Using shared control, BMI-controlled movements and autonomous grasping commands were blended to ensure secure grasps.

Results: Both subjects were more successful on object transfer tasks when using shared control compared to BMI control alone. Movements made using shared control were more accurate, more efficient, and less difficult. One participant attempted a task with multiple objects and successfully lifted one of two closely spaced objects in 92% of trials, demonstrating the potential for users to accurately execute their intention while using shared control.

Conclusions: Integration of BMI control with vision-guided robotic assistance led to improved performance on object transfer tasks. Providing assistance while maintaining generalizability will make BMI systems more attractive to potential users.
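The blending of BMI-derived commands with autonomous grasping commands described in this abstract can be illustrated with a minimal sketch. The linear-ramp weighting, the `assist_radius` parameter, and all function names below are illustrative assumptions, not the authors' actual shared-control law:

```python
import numpy as np

def blend_commands(v_bmi, v_auto, dist_to_object, assist_radius=0.15):
    """Blend a BMI-decoded hand velocity with an autonomous grasp-approach
    velocity. The assistance weight ramps up as the hand nears a detected
    object; far from any object the user retains full control.
    (Hypothetical sketch; parameters are assumed, not from the paper.)
    """
    # alpha = 0 outside the assist radius, rising linearly to 1 at the object
    alpha = np.clip(1.0 - dist_to_object / assist_radius, 0.0, 1.0)
    return (1.0 - alpha) * np.asarray(v_bmi) + alpha * np.asarray(v_auto)

# Far from the object, the command is purely user-driven;
# at the object, the autonomous grasp command dominates.
v_far = blend_commands([0.1, 0.0, 0.0], [0.0, 0.2, 0.0], dist_to_object=0.5)
v_near = blend_commands([0.1, 0.0, 0.0], [0.0, 0.2, 0.0], dist_to_object=0.0)
```

This kind of proximity-weighted blend preserves user intent during free reaching while ensuring a stable grasp pose at contact.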
Grounded language acquisition is an important issue, particularly for facilitating human-robot interaction in an intelligent and effective way. Evolutionary and developmental language acquisition are two innovative and important methodologies for the grounding of language in cognitive agents or robots, the aim of which is to address current limitations in robot design. This paper concentrates on these two main modelling methods, together with the grounding principle, for the acquisition of linguistic ability in cognitive agents or robots. This review not only presents a survey of the methodologies and relevant computational cognitive agent or robotic models, but also highlights the advantages of, and progress made by, these approaches on the language grounding issue.
The aim of this study is to present electrooculogram (EOG) and surface electromyogram (sEMG) signals that can be used as a human-computer interface. Establishing an efficient alternative channel for communication without overt speech and hand movements is important for increasing the quality of life of patients suffering from amyotrophic lateral sclerosis, muscular dystrophy, or other illnesses. In this paper, we propose an EOG-sEMG human-computer interface system for communication using both cross-channels and parallel-line channels on the face with the same electrodes. This system can record EOG and sEMG signals as a "dual modality" for pattern recognition simultaneously. Although as many as four patterns could be recognized, considering the condition of the patients, we chose only two classes (left and right motion) of EOG and two classes (left blink and right blink) of sEMG, which are easy to realize for the simulation and monitoring tasks. From the simulation results, our system achieved four-pattern classification with an accuracy of 95.1%.
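Combining two binary EOG classes with two binary sEMG classes yields the four output patterns mentioned in this abstract. A minimal sketch of that fusion step (class names and the pattern numbering are illustrative assumptions, not the paper's actual labels):

```python
def fuse_patterns(eog_class: str, semg_class: str) -> int:
    """Map one of two EOG classes (gaze motion) and one of two sEMG
    classes (blink side) onto four combined output patterns.
    (Hypothetical sketch; the study's real pipeline classifies each
    modality from recorded signals before this fusion step.)
    """
    eog_idx = {"left": 0, "right": 1}[eog_class]
    semg_idx = {"left_blink": 0, "right_blink": 1}[semg_class]
    return eog_idx * 2 + semg_idx  # pattern id in {0, 1, 2, 3}
```

Because each modality contributes an independent binary decision, two reliable two-class classifiers are enough to span a four-command interface.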
We perceive that some brain-computer interface (BCI) researchers believe in totally different origins of invasive and non-invasive electrical BCI signals. Based on the available literature we argue, however, that although invasive and non-invasive BCI signals are different, the underlying origin of electrical BCI signals is the same.
The sense of agency is the experience of controlling both one's body and the external environment. Although the sense of agency has been studied extensively, there is a paucity of studies in applied "real-life" situations. One applied domain that seems highly relevant is human-computer interaction (HCI), as an increasing number of our everyday agentive interactions involve technology. Indeed, HCI has long recognized the feeling of control as a key factor in how people experience interactions with technology. The aim of this review is to summarize and examine the possible links between sense of agency and understanding control in HCI. We explore the overlap between HCI and sense of agency for computer input modalities and system feedback, computer assistance, and joint actions between humans and computers. An overarching consideration is how agency research can inform HCI and vice versa. Finally, we discuss the potential ethical implications of personal responsibility in an ever-increasing society of technology users and intelligent machine interfaces.
Several groups have developed brain-machine interfaces (BMIs) that allow primates to use cortical activity to control artificial devices. Yet, it remains unknown whether cortical ensembles could represent the kinematics of whole-body navigation and be used to operate a BMI that moves a wheelchair continuously in space. Here we show that rhesus monkeys can learn to navigate a robotic wheelchair, using their cortical activity as the main control signal. Two monkeys were chronically implanted with multichannel microelectrode arrays that allowed wireless recordings from ensembles of premotor and sensorimotor cortical neurons. Initially, while monkeys remained seated in the robotic wheelchair, passive navigation was employed to train a linear decoder to extract wheelchair kinematics from cortical activity. Next, monkeys employed the wireless BMI to translate their cortical activity into the robotic wheelchair's translational and rotational velocities. Over time, monkeys improved their ability to navigate the wheelchair toward the location of a grape reward. The navigation was enacted by populations of cortical neurons tuned to whole-body displacement. During practice with the apparatus, we also noticed the presence of a cortical representation of the distance to reward location. These results demonstrate that intracranial BMIs could restore whole-body mobility to severely paralyzed patients in the future.
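The linear decoding of translational and rotational wheelchair velocities from cortical ensemble activity described in this abstract can be sketched with a least-squares fit. The synthetic firing rates, dimensions, and noise level below are stand-ins for real recordings; this illustrates the general technique, not the study's actual decoder:

```python
import numpy as np

# Synthetic stand-in for a neural recording session (assumed sizes):
# 500 time bins, 40 neurons, 2 outputs (translational, rotational velocity).
rng = np.random.default_rng(0)
n_samples, n_neurons = 500, 40
W_true = rng.normal(size=(n_neurons, 2))              # hidden tuning weights
rates = rng.poisson(5.0, size=(n_samples, n_neurons)).astype(float)
velocity = rates @ W_true + rng.normal(0.0, 0.1, size=(n_samples, 2))

# Fit decoder weights by ordinary least squares, as in a passive
# "training" phase where true kinematics are known.
W_hat, *_ = np.linalg.lstsq(rates, velocity, rcond=None)

# At run time the decoder maps each bin of firing rates to a
# [translational, rotational] velocity command for the wheelchair.
predicted = rates @ W_hat
```

With enough training samples, the recovered weights closely match the generating weights, which is why a brief passive-navigation phase suffices to calibrate this class of decoder.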